Sliced Inverse Regression for Dimension Reduction

Author

  • KER-CHAU LI

Abstract


Similar articles

A note on shrinkage sliced inverse regression

We employ Lasso shrinkage within the context of sufficient dimension reduction to obtain a shrinkage sliced inverse regression estimator, which provides easier interpretations and better prediction accuracy without assuming a parametric model. The shrinkage sliced inverse regression approach can be employed for both single-index and multiple-index models. Simulation studies suggest that the new...
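The shrinkage estimator above builds on the basic SIR procedure: standardize the predictors, slice the response, and extract the leading eigenvectors of the covariance matrix of the slice-wise means. A minimal NumPy sketch of plain SIR (function name, slicing scheme, and defaults are illustrative assumptions, not the notation of any paper listed here):

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Sliced Inverse Regression: estimate effective dimension-reduction
    directions from the covariance of slice means of standardized X."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into roughly equal-count slices of y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]
```

A shrinkage variant would sparsify the entries of these directions (e.g. via a Lasso penalty) to ease interpretation, as the abstract describes.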


Sufficient dimension reduction in regressions across heterogeneous subpopulations

Sliced inverse regression is one of the widely used dimension reduction methods. Chiaromonte and co-workers extended this method to regressions with qualitative predictors and developed a method, partial sliced inverse regression, under the assumption that the covariance matrices of the continuous predictors are constant across the levels of the qualitative predictor. We extend partial sliced i...


Sliced Inverse Moment Regression Using Weighted Chi-Squared Tests for Dimension Reduction

We propose a new class of dimension reduction methods using the first two inverse moments, called Sliced Inverse Moment Regression (SIMR). We develop corresponding weighted chi-squared tests for the dimension of the regression. Essentially, SIMR estimators are linear combinations of Sliced Inverse Regression (SIR) and a method using a new candidate matrix designed to recover the entire inverse se...


Consistency of regularized sliced inverse regression for kernel models

We develop an extension of the sliced inverse regression (SIR) framework for dimension reduction using kernel models and Tikhonov regularization. The result is a numerically stable nonlinear dimension reduction method. We prove consistency of the method under weak conditions even when the reproducing kernel Hilbert space induced by the kernel is infinite dimensional. We illustrate the utility o...
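The Tikhonov idea above can be illustrated in the simpler linear setting: replace the sample covariance Σ̂ by Σ̂ + λI before inverting, which stabilizes SIR when predictors are collinear or high-dimensional. A hedged sketch (the function name, the default λ, and the slicing are illustrative; this linear version is only an analogue of the kernel/RKHS method the abstract describes):

```python
import numpy as np

def regularized_sir(X, y, lam=1e-2, n_slices=10, n_directions=2):
    """Ridge-regularized SIR: solve the generalized eigenproblem
    M b = w (Sigma + lam * I) b, where M is the between-slice
    covariance of the predictor means."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False) + lam * np.eye(p)  # Tikhonov term
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Generalized eigenproblem via Sigma^{-1} M (nonsymmetric, so take
    # real parts and sort eigenvalues in decreasing order)
    w, v = np.linalg.eig(np.linalg.solve(Sigma, M))
    idx_sorted = np.argsort(-w.real)
    return v.real[:, idx_sorted[:n_directions]]
```

Larger λ trades a small bias for a better-conditioned inverse, the same bias-variance trade-off that makes the kernel version numerically stable.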


Likelihood-based Sufficient Dimension Reduction

We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation, and directional regression, and that it seems quite robust to deviations from normality.



Journal title:

Volume   Issue 

Pages  -

Publication date: 2010